Console Output

Training and evaluating model for: Tablet
Dataset length: 12477 windows
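The 12477 windows are presumably produced by slicing the multivariate input series into fixed-length, possibly overlapping segments. A minimal sketch of such windowing; the window length and stride here are placeholders, since the log does not state the values used:

```python
import numpy as np

def make_windows(series, window, stride=1):
    """Slice a (T, n_features) time series into overlapping windows.

    Returns an array of shape (n_windows, window, n_features).
    """
    n = (len(series) - window) // stride + 1
    return np.stack([series[i * stride : i * stride + window] for i in range(n)])
```

With a 9-feature input (matching the model printout below), `make_windows(data, window=60)` would yield windows of shape `(60, 9)`, ready to batch into the network.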

NILMModel(
  (conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
  (lstm): LSTM(9, 128, num_layers=3, batch_first=True, dropout=0.1)
  (dropout): Dropout(p=0.1, inplace=False)
  (relu): ReLU()
  (output_layer): Linear(in_features=128, out_features=1, bias=True)
)
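The printed module list corresponds to a PyTorch model along these lines. The layer definitions are taken directly from the printout; the forward pass (Conv1d over the feature axis, LSTM over time, regression head on the last time step) is an assumption inferred from the layer shapes, not taken from the original code:

```python
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    """CNN + LSTM regressor matching the architecture printed above."""

    def __init__(self, n_features=9, hidden=128, n_layers=3, dropout=0.1):
        super().__init__()
        self.conv1d = nn.Conv1d(n_features, n_features, kernel_size=3,
                                stride=1, padding=1)
        self.lstm = nn.LSTM(n_features, hidden, num_layers=n_layers,
                            batch_first=True, dropout=dropout)
        self.dropout = nn.Dropout(p=dropout)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (batch, seq_len, n_features)
        x = x.transpose(1, 2)              # (batch, n_features, seq_len) for Conv1d
        x = self.relu(self.conv1d(x))
        x = x.transpose(1, 2)              # back to (batch, seq_len, n_features)
        out, _ = self.lstm(x)
        out = self.dropout(out[:, -1, :])  # last time step only
        return self.output_layer(out)      # (batch, 1) predicted appliance power
```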
Epoch [1/300], Train Loss: 0.063036
Validation Loss: 0.017157
Epoch [2/300], Train Loss: 0.017167
Validation Loss: 0.015699
Epoch [3/300], Train Loss: 0.016480
Validation Loss: 0.015056
Epoch [4/300], Train Loss: 0.015171
Validation Loss: 0.012594
Epoch [5/300], Train Loss: 0.012108
Validation Loss: 0.011083
Epoch [6/300], Train Loss: 0.011031
Validation Loss: 0.010244
Epoch [7/300], Train Loss: 0.010083
Validation Loss: 0.009413
Epoch [8/300], Train Loss: 0.009628
Validation Loss: 0.009206
Epoch [9/300], Train Loss: 0.009481
Validation Loss: 0.009121
Epoch [10/300], Train Loss: 0.009225
Validation Loss: 0.008823
Epoch [11/300], Train Loss: 0.009132
Validation Loss: 0.008821
Epoch [12/300], Train Loss: 0.008992
Validation Loss: 0.008622
Epoch [13/300], Train Loss: 0.008856
Validation Loss: 0.008735
Epoch [14/300], Train Loss: 0.008803
Validation Loss: 0.008562
Epoch [15/300], Train Loss: 0.008708
Validation Loss: 0.008615
Epoch [16/300], Train Loss: 0.008665
Validation Loss: 0.008383
Epoch [17/300], Train Loss: 0.008726
Validation Loss: 0.008392
Epoch [18/300], Train Loss: 0.008514
Validation Loss: 0.008463
Epoch [19/300], Train Loss: 0.008492
Validation Loss: 0.008266
Epoch [20/300], Train Loss: 0.008374
Validation Loss: 0.008131
Epoch [21/300], Train Loss: 0.008327
Validation Loss: 0.008145
Epoch [22/300], Train Loss: 0.008280
Validation Loss: 0.008150
Epoch [23/300], Train Loss: 0.008209
Validation Loss: 0.008050
Epoch [24/300], Train Loss: 0.008189
Validation Loss: 0.007981
Epoch [25/300], Train Loss: 0.008372
Validation Loss: 0.008116
Epoch [26/300], Train Loss: 0.008102
Validation Loss: 0.008150
Epoch [27/300], Train Loss: 0.008088
Validation Loss: 0.007933
Epoch [28/300], Train Loss: 0.008228
Validation Loss: 0.007943
Epoch [29/300], Train Loss: 0.008000
Validation Loss: 0.007860
Epoch [30/300], Train Loss: 0.007997
Validation Loss: 0.007828
Epoch [31/300], Train Loss: 0.008057
Validation Loss: 0.007857
Epoch [32/300], Train Loss: 0.007931
Validation Loss: 0.007820
Epoch [33/300], Train Loss: 0.007959
Validation Loss: 0.007986
Epoch [34/300], Train Loss: 0.007889
Validation Loss: 0.007985
Epoch [35/300], Train Loss: 0.009103
Validation Loss: 0.009503
Epoch [36/300], Train Loss: 0.008171
Validation Loss: 0.007893
Epoch [37/300], Train Loss: 0.007889
Validation Loss: 0.007846
Epoch [38/300], Train Loss: 0.007974
Validation Loss: 0.007941
Epoch [39/300], Train Loss: 0.007814
Validation Loss: 0.007843
Epoch [40/300], Train Loss: 0.007774
Validation Loss: 0.007750
Epoch [41/300], Train Loss: 0.007777
Validation Loss: 0.007768
Epoch [42/300], Train Loss: 0.007952
Validation Loss: 0.009471
Epoch [43/300], Train Loss: 0.008230
Validation Loss: 0.008090
Epoch [44/300], Train Loss: 0.007781
Validation Loss: 0.007700
Epoch [45/300], Train Loss: 0.007681
Validation Loss: 0.007617
Epoch [46/300], Train Loss: 0.007669
Validation Loss: 0.007680
Epoch [47/300], Train Loss: 0.007645
Validation Loss: 0.007595
Epoch [48/300], Train Loss: 0.007596
Validation Loss: 0.007597
Epoch [49/300], Train Loss: 0.007596
Validation Loss: 0.007630
Epoch [50/300], Train Loss: 0.008103
Validation Loss: 0.007807
Epoch [51/300], Train Loss: 0.007692
Validation Loss: 0.007630
Epoch [52/300], Train Loss: 0.007610
Validation Loss: 0.007765
Epoch [53/300], Train Loss: 0.007603
Validation Loss: 0.007627
Epoch [54/300], Train Loss: 0.007565
Validation Loss: 0.007590
Epoch [55/300], Train Loss: 0.007530
Validation Loss: 0.007532
Epoch [56/300], Train Loss: 0.007469
Validation Loss: 0.007526
Epoch [57/300], Train Loss: 0.007451
Validation Loss: 0.007466
Epoch [58/300], Train Loss: 0.007423
Validation Loss: 0.007490
Epoch [59/300], Train Loss: 0.007412
Validation Loss: 0.007474
Epoch [60/300], Train Loss: 0.007407
Validation Loss: 0.007600
Epoch [61/300], Train Loss: 0.007384
Validation Loss: 0.007495
Epoch [62/300], Train Loss: 0.007359
Validation Loss: 0.007407
Epoch [63/300], Train Loss: 0.007302
Validation Loss: 0.007390
Epoch [64/300], Train Loss: 0.007290
Validation Loss: 0.007329
Epoch [65/300], Train Loss: 0.007360
Validation Loss: 0.007296
Epoch [66/300], Train Loss: 0.007212
Validation Loss: 0.007256
Epoch [67/300], Train Loss: 0.007197
Validation Loss: 0.007267
Epoch [68/300], Train Loss: 0.007093
Validation Loss: 0.007164
Epoch [69/300], Train Loss: 0.007007
Validation Loss: 0.006997
Epoch [70/300], Train Loss: 0.006964
Validation Loss: 0.006963
Epoch [71/300], Train Loss: 0.006817
Validation Loss: 0.006918
Epoch [72/300], Train Loss: 0.006815
Validation Loss: 0.006776
Epoch [73/300], Train Loss: 0.006790
Validation Loss: 0.012056
Epoch [74/300], Train Loss: 0.008259
Validation Loss: 0.007715
Epoch [75/300], Train Loss: 0.007557
Validation Loss: 0.007307
Epoch [76/300], Train Loss: 0.006984
Validation Loss: 0.007057
Epoch [77/300], Train Loss: 0.006902
Validation Loss: 0.006918
Epoch [78/300], Train Loss: 0.006828
Validation Loss: 0.006906
Epoch [79/300], Train Loss: 0.006768
Validation Loss: 0.006794
Epoch [80/300], Train Loss: 0.006736
Validation Loss: 0.006739
Epoch [81/300], Train Loss: 0.006705
Validation Loss: 0.006826
Epoch [82/300], Train Loss: 0.006682
Validation Loss: 0.006660
Epoch [83/300], Train Loss: 0.006614
Validation Loss: 0.006784
Epoch [84/300], Train Loss: 0.006574
Validation Loss: 0.006579
Epoch [85/300], Train Loss: 0.006549
Validation Loss: 0.006573
Epoch [86/300], Train Loss: 0.006695
Validation Loss: 0.007600
Epoch [87/300], Train Loss: 0.007899
Validation Loss: 0.006744
Epoch [88/300], Train Loss: 0.006712
Validation Loss: 0.006624
Epoch [89/300], Train Loss: 0.006590
Validation Loss: 0.006558
Epoch [90/300], Train Loss: 0.006537
Validation Loss: 0.006565
Epoch [91/300], Train Loss: 0.006511
Validation Loss: 0.006544
Epoch [92/300], Train Loss: 0.006477
Validation Loss: 0.006529
Epoch [93/300], Train Loss: 0.006474
Validation Loss: 0.006616
Epoch [94/300], Train Loss: 0.006470
Validation Loss: 0.006560
Epoch [95/300], Train Loss: 0.006454
Validation Loss: 0.006512
Epoch [96/300], Train Loss: 0.011658
Validation Loss: 0.014191
Epoch [97/300], Train Loss: 0.010775
Validation Loss: 0.007114
Epoch [98/300], Train Loss: 0.006931
Validation Loss: 0.006788
Epoch [99/300], Train Loss: 0.006831
Validation Loss: 0.006806
Epoch [100/300], Train Loss: 0.006761
Validation Loss: 0.006668
Epoch [101/300], Train Loss: 0.006713
Validation Loss: 0.006669
Epoch [102/300], Train Loss: 0.006671
Validation Loss: 0.006625
Epoch [103/300], Train Loss: 0.006662
Validation Loss: 0.006660
Epoch [104/300], Train Loss: 0.006640
Validation Loss: 0.006566
Epoch [105/300], Train Loss: 0.006588
Validation Loss: 0.006587
Early stopping triggered
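A minimal early-stopping helper consistent with this behavior. The `patience` and `min_delta` values are assumptions; that said, the best validation loss above occurs at epoch 95 (0.006512) and training stops at epoch 105, which would match a patience of about 10 epochs:

```python
class EarlyStopping:
    """Stop training once validation loss has not improved for `patience` epochs."""

    def __init__(self, patience=10, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.counter = 0
        self.should_stop = False

    def step(self, val_loss):
        if val_loss < self.best - self.min_delta:
            # Improvement: record new best and reset the counter.
            self.best = val_loss
            self.counter = 0
        else:
            self.counter += 1
            if self.counter >= self.patience:
                self.should_stop = True
        return self.should_stop
```

In the training loop one would call `stopper.step(val_loss)` after each validation pass and break out of the epoch loop when it returns `True`.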

Evaluating model for: Tablet
Validation MAE: 0.601728 W
Validation MSE: 0.944910 W²
Validation RMSE: 0.972065 W
Signal Aggregate Error (SAE): 0.007844
Normalized Disaggregation Error (NDE): 0.223361
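The metrics above can be computed as follows. SAE and NDE use the standard NILM definitions (relative error in total predicted energy, and squared error normalized by the true signal's energy); that these match the original script's definitions is an assumption:

```python
import numpy as np

def nilm_metrics(y_true, y_pred):
    """Return (MAE, MSE, RMSE, SAE, NDE) for a disaggregated appliance signal."""
    err = y_pred - y_true
    mae = np.mean(np.abs(err))                           # W
    mse = np.mean(err ** 2)                              # W^2
    rmse = np.sqrt(mse)                                  # W
    # Signal Aggregate Error: relative error in total energy.
    sae = np.abs(y_pred.sum() - y_true.sum()) / y_true.sum()
    # Normalized Disaggregation Error: squared error over true signal energy.
    nde = np.sum(err ** 2) / np.sum(y_true ** 2)
    return mae, mse, rmse, sae, nde
```

Note that a low SAE (0.0078 here) with a higher NDE (0.223) is plausible: total energy is matched well even though point-by-point errors remain.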

[Figure: Training and Validation Loss curves for the Tablet model (static and interactive plots in the original page)]